ML-As-3
Problem 1: Hard-margin SVM. (18 pts)
You are given the following two sets of data points, each belonging to one of the two classes (class 1 and class -1):
- Class 1 (labeled as +1):
- Class -1 (labeled as -1):
Please find the optimal separating hyperplane using a linear SVM and derive the equation of the hyperplane. Assume a hard-margin SVM.
1. Write down the formulation of the SVM, including the separating hyperplane, the constraints, and the final optimization problem with parameters. (4 pts)
The hyperplane is defined through
$$\mathbf{w}^\top \mathbf{x} + b = 0,$$
where $\mathbf{w}$ is the weight vector and $b$ is a scalar bias.
Subject to the constraints
$$y_i(\mathbf{w}^\top \mathbf{x}_i + b) \ge 1, \quad i = 1, \dots, n,$$
where $y_i \in \{+1, -1\}$ is the class label of $\mathbf{x}_i$.
Final optimization problem:
$$\min_{\mathbf{w}, b} \; \frac{1}{2}\|\mathbf{w}\|^2 \quad \text{s.t.} \quad y_i(\mathbf{w}^\top \mathbf{x}_i + b) \ge 1, \quad i = 1, \dots, n.$$
2. Write down the Lagrangian form for this problem using the parameters and Lagrange multipliers. Please also write out its dual form. (10 pts)
The Lagrangian form:
$$L(\mathbf{w}, b, \boldsymbol{\alpha}) = \frac{1}{2}\|\mathbf{w}\|^2 - \sum_{i=1}^{n} \alpha_i \left[ y_i(\mathbf{w}^\top \mathbf{x}_i + b) - 1 \right],$$
where $\alpha_i \ge 0$ are the Lagrange multipliers.
The dual form of the optimization problem:
$$\max_{\boldsymbol{\alpha}} \; \sum_{i=1}^{n} \alpha_i - \frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j y_i y_j \mathbf{x}_i^\top \mathbf{x}_j,$$
subject to
$$\alpha_i \ge 0, \quad \sum_{i=1}^{n} \alpha_i y_i = 0.$$
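The dual is a quadratic program, so it can be checked numerically. A minimal sketch using `scipy.optimize.minimize` (SLSQP) on two hypothetical points, not the assignment's data:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy data (assumed for illustration).
X = np.array([[2.0, 2.0], [-1.0, -1.0]])
y = np.array([1.0, -1.0])

# Q_ij = y_i y_j x_i^T x_j, the quadratic term of the dual.
Q = (y[:, None] * y[None, :]) * (X @ X.T)

def neg_dual(alpha):
    # scipy minimizes, so negate the dual objective sum(alpha) - 1/2 a^T Q a.
    return 0.5 * alpha @ Q @ alpha - alpha.sum()

res = minimize(neg_dual, x0=np.zeros(2), bounds=[(0.0, None)] * 2,
               constraints=[{"type": "eq", "fun": lambda a: a @ y}])
alpha = res.x

# Recover w from the stationarity condition w = sum_i alpha_i y_i x_i.
w = (alpha * y) @ X
print(alpha, w)
```

For these two points the analytic solution is $\alpha_1 = \alpha_2 = 1/9$ and $\mathbf{w} = (1/3, 1/3)$, which the solver reproduces.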
3. Assume that the Lagrange multipliers $\alpha_i$'s are all 0.5 and that the point is a support vector, for ease of calculation. Please calculate the values of the weight vector $\mathbf{w}$ and bias $b$. Write out the explicit form of the hyperplane. (4 pts)
If $\alpha_i = 0.5$ for all $i$, then
$$\mathbf{w} = \sum_{i=1}^{n} \alpha_i y_i \mathbf{x}_i = 0.5 \sum_{i=1}^{n} y_i \mathbf{x}_i.$$
Since the support vector $\mathbf{x}_s$ satisfies $y_s(\mathbf{w}^\top \mathbf{x}_s + b) = 1$, the bias is
$$b = y_s - \mathbf{w}^\top \mathbf{x}_s.$$
The explicit form of the hyperplane is $\mathbf{w}^\top \mathbf{x} + b = 0$ with these values of $\mathbf{w}$ and $b$.
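The recipe in this part ($\mathbf{w} = \sum_i \alpha_i y_i \mathbf{x}_i$, then $b$ from a support vector) can be sketched numerically; the two points below are hypothetical, while $\alpha_i = 0.5$ follows the problem's assumption:

```python
import numpy as np

# Hypothetical data; alpha_i = 0.5 as assumed in the problem statement.
X = np.array([[2.0, 2.0], [-1.0, -1.0]])
y = np.array([1.0, -1.0])
alpha = np.array([0.5, 0.5])

# w = sum_i alpha_i y_i x_i
w = (alpha * y) @ X

# b = y_s - w^T x_s, treating the first point as the support vector.
s = 0
b = y[s] - w @ X[s]
print(w, b)   # -> [1.5 1.5] -5.0
```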
Problem 2: Soft-margin SVM. (20 pts)
Suppose we have the data points
1. Write down the formulation of the soft-margin SVM for this problem using $\mathbf{w}$, $b$, $\boldsymbol{\xi}$, and $C$. Write out explicitly their dimensions. (3 pts)
For a soft-margin SVM, the optimization problem can be formulated as follows:
$$\min_{\mathbf{w}, b, \boldsymbol{\xi}} \; \frac{1}{2}\|\mathbf{w}\|^2 + C \sum_{i=1}^{n} \xi_i$$
subject to:
$$y_i(\mathbf{w}^\top \mathbf{x}_i + b) \ge 1 - \xi_i, \quad \xi_i \ge 0, \quad i = 1, \dots, n,$$
where:
$\mathbf{w}$ is the weight vector, $b$ is the bias, $\boldsymbol{\xi}$ is the vector of slack variables, and $C$ is the regularization parameter that controls the trade-off between maximizing the margin and minimizing the classification error.
Dimensions:
For $n$ training points in $\mathbb{R}^d$: $\mathbf{w}$ has dimension $d$, each $\mathbf{x}_i$ has dimension $d$, $\boldsymbol{\xi}$ has dimension $n$, $b$ is a scalar, and $\mathbf{y}$ has dimension $n$.
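To make the objective concrete, this sketch evaluates the slack variables and the soft-margin objective for a hypothetical $(\mathbf{w}, b)$ on made-up points, one of which violates the margin:

```python
import numpy as np

# Hypothetical points and a hypothetical hyperplane (assumed for illustration).
X = np.array([[2.0, 0.0], [0.5, 0.0], [-2.0, 0.0]])
y = np.array([1.0, -1.0, -1.0])
w, b, C = np.array([1.0, 0.0]), 0.0, 10.0

# Slack: xi_i = max(0, 1 - y_i (w^T x_i + b)); nonzero only for margin violators.
xi = np.maximum(0.0, 1.0 - y * (X @ w + b))

# Soft-margin objective: 1/2 ||w||^2 + C * sum_i xi_i.
objective = 0.5 * (w @ w) + C * xi.sum()
print(xi, objective)
```

Only the second point (inside the margin) gets a positive slack; a larger $C$ penalizes that violation more heavily.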
2. Write down the Lagrangian form and derive the dual for the problem. Write down the detailed derivation steps. (12 pts)
The Lagrangian of the primal problem is:
$$L(\mathbf{w}, b, \boldsymbol{\xi}, \boldsymbol{\alpha}, \boldsymbol{\mu}) = \frac{1}{2}\|\mathbf{w}\|^2 + C\sum_{i=1}^{n} \xi_i - \sum_{i=1}^{n} \alpha_i \left[ y_i(\mathbf{w}^\top \mathbf{x}_i + b) - 1 + \xi_i \right] - \sum_{i=1}^{n} \mu_i \xi_i,$$
where $\alpha_i \ge 0$ and $\mu_i \ge 0$ are the Lagrange multipliers.
To derive the dual problem, we take the partial derivatives of $L$ with respect to the primal variables $\mathbf{w}$, $b$, and $\boldsymbol{\xi}$ and set them to zero.
Partial derivative with respect to $\mathbf{w}$:
$$\frac{\partial L}{\partial \mathbf{w}} = \mathbf{w} - \sum_{i=1}^{n} \alpha_i y_i \mathbf{x}_i = 0 \quad\Rightarrow\quad \mathbf{w} = \sum_{i=1}^{n} \alpha_i y_i \mathbf{x}_i.$$
Partial derivative with respect to $b$:
$$\frac{\partial L}{\partial b} = -\sum_{i=1}^{n} \alpha_i y_i = 0 \quad\Rightarrow\quad \sum_{i=1}^{n} \alpha_i y_i = 0.$$
Partial derivative with respect to $\xi_i$:
$$\frac{\partial L}{\partial \xi_i} = C - \alpha_i - \mu_i = 0 \quad\Rightarrow\quad \mu_i = C - \alpha_i \ge 0 \quad\Rightarrow\quad 0 \le \alpha_i \le C.$$
By substituting these conditions back into $L$, the terms involving $\mathbf{w}$, $b$, and $\boldsymbol{\xi}$ simplify and the slack terms cancel, leaving the dual:
$$\max_{\boldsymbol{\alpha}} \; \sum_{i=1}^{n} \alpha_i - \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n} \alpha_i \alpha_j y_i y_j \mathbf{x}_i^\top \mathbf{x}_j$$
subject to:
$$0 \le \alpha_i \le C, \quad \sum_{i=1}^{n} \alpha_i y_i = 0.$$
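The soft-margin dual differs from the hard-margin one only in the box constraint $0 \le \alpha_i \le C$. A minimal solver sketch with hypothetical 1-D points (not the assignment's data):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical 1-D points (assumed for illustration).
X = np.array([[2.0], [0.3], [-2.0]])
y = np.array([1.0, -1.0, -1.0])
C = 0.5

# Q_ij = y_i y_j x_i^T x_j.
Q = (y[:, None] * y[None, :]) * (X @ X.T)

def neg_dual(alpha):
    # scipy minimizes, so negate the dual objective.
    return 0.5 * alpha @ Q @ alpha - alpha.sum()

# Box constraint 0 <= alpha_i <= C plus the equality sum_i alpha_i y_i = 0.
res = minimize(neg_dual, np.zeros(3), bounds=[(0.0, C)] * 3,
               constraints=[{"type": "eq", "fun": lambda a: a @ y}])
alpha = res.x
print(alpha)
```

With a small enough $C$, some multipliers land on the upper bound $\alpha_i = C$, which is exactly how margin violations surface in the dual.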
3. Obtain the decision boundary. (3 pts)
The decision boundary is given by:
$$f(\mathbf{x}) = \operatorname{sign}\left(\sum_{i=1}^{n} \alpha_i y_i \mathbf{x}_i^\top \mathbf{x} + b\right),$$
where $b = y_s - \sum_{i=1}^{n} \alpha_i y_i \mathbf{x}_i^\top \mathbf{x}_s$ for any support vector $\mathbf{x}_s$ with $0 < \alpha_s < C$. The boundary itself is the set of points satisfying $\mathbf{w}^\top \mathbf{x} + b = 0$.
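The decision rule can be evaluated directly from the multipliers, without forming $\mathbf{w}$ explicitly. A sketch in which the points, $\alpha_i$'s, and $b$ are all hypothetical values chosen to be dual-feasible:

```python
import numpy as np

# Hypothetical trained quantities (points, alpha, and b are illustrative assumptions).
X = np.array([[2.0, 2.0], [-1.0, -1.0]])
y = np.array([1.0, -1.0])
alpha = np.array([1/9, 1/9])
b = -1/3

def decide(x_new):
    # f(x) = sign(sum_i alpha_i y_i <x_i, x> + b)
    return int(np.sign((alpha * y) @ (X @ x_new) + b))

print(decide(np.array([3.0, 3.0])))    # -> 1
print(decide(np.array([-2.0, -2.0]))) # -> -1
```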
4. Explain why $\boldsymbol{\xi}$ disappears in the dual. (2 pts)
In the dual formulation, the stationarity condition $\partial L / \partial \xi_i = 0$ gives $C = \alpha_i + \mu_i$. Substituting this back into the Lagrangian makes every slack term cancel, since $C\xi_i - \alpha_i \xi_i - \mu_i \xi_i = (C - \alpha_i - \mu_i)\xi_i = 0$. The slack variables therefore drop out of the dual objective, and their only trace is the box constraint $0 \le \alpha_i \le C$ obtained by eliminating $\mu_i \ge 0$.
Problem 3: Kernel SVM. (17 pts)
Consider the following
We want to use the polynomial kernel
To solve Problem 3 on Kernel SVM, let’s go through each part step-by-step.
1. Compute the Kernel Matrix (6 pts)
The kernel matrix $K$ has entries $K_{ij} = K(\mathbf{x}_i, \mathbf{x}_j)$ for every pair of training points.
Since $K(\mathbf{x}_i, \mathbf{x}_j) = K(\mathbf{x}_j, \mathbf{x}_i)$, the matrix is symmetric, so only the entries on and above the diagonal need to be computed.
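A sketch of building such a kernel matrix. The degree-2 polynomial kernel $K(\mathbf{x}, \mathbf{z}) = (\mathbf{x}^\top \mathbf{z} + 1)^2$ and the 1-D points here are assumptions for illustration; they are not necessarily the assignment's kernel or data:

```python
import numpy as np

# Hypothetical 1-D training points.
X = np.array([[1.0], [2.0], [-1.0]])

# Example degree-2 polynomial kernel: K(x, z) = (x^T z + 1)^2, applied pairwise.
K = (X @ X.T + 1.0) ** 2
print(K)
print(bool(np.allclose(K, K.T)))  # the kernel matrix is symmetric
```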
2. Set up the Dual Optimization Problem. You can use the results from Problem 2. (4 pts)
Using the results from Problem 2, the dual problem for a soft-margin SVM with a kernel function becomes:
$$\max_{\boldsymbol{\alpha}} \; \sum_{i=1}^{n} \alpha_i - \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n} \alpha_i \alpha_j y_i y_j K(\mathbf{x}_i, \mathbf{x}_j)$$
subject to:
$$0 \le \alpha_i \le C, \quad \sum_{i=1}^{n} \alpha_i y_i = 0,$$
where the kernel value $K(\mathbf{x}_i, \mathbf{x}_j)$ replaces the inner product $\mathbf{x}_i^\top \mathbf{x}_j$ of the linear dual.
3. Suppose the Lagrange multipliers $\alpha_i$'s are known, and compute the bias $b$. (2 pts)
The bias $b$ is computed from the condition that a support vector $\mathbf{x}_s$ with $0 < \alpha_s < C$ lies exactly on the margin:
$$b = y_s - \sum_{i=1}^{n} \alpha_i y_i K(\mathbf{x}_i, \mathbf{x}_s),$$
where we can use any such support vector for the calculation.
Substitute the given $\alpha_i$'s, the labels $y_i$, and the kernel values $K(\mathbf{x}_i, \mathbf{x}_s)$ into the sum.
Calculating each term $\alpha_i y_i K(\mathbf{x}_i, \mathbf{x}_s)$ in the summation and summing these values yields $b$.
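The bias formula can be checked numerically. The multipliers, labels, points, and kernel below are all hypothetical (note that $\sum_i \alpha_i y_i = 0$ holds, as the dual requires):

```python
import numpy as np

# Hypothetical 1-D points, labels, and multipliers; kernel assumed (x z + 1)^2.
X = np.array([1.0, 2.0, -1.0])
y = np.array([1.0, 1.0, -1.0])
alpha = np.array([0.5, 0.0, 0.5])

def K(a, c):
    return (a * c + 1.0) ** 2

# b = y_s - sum_i alpha_i y_i K(x_i, x_s), treating point 0 as the support vector.
s = 0
b = y[s] - sum(alpha[i] * y[i] * K(X[i], X[s]) for i in range(3))
print(b)   # -> -1.0
```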
4. Classify a New Point using the learned kernel SVM model. (5 pts)
To classify the point $\mathbf{x}_{\text{new}}$, evaluate the decision function
$$f(\mathbf{x}_{\text{new}}) = \operatorname{sign}\left(\sum_{i=1}^{n} \alpha_i y_i K(\mathbf{x}_i, \mathbf{x}_{\text{new}}) + b\right).$$
Let's compute each kernel value $K(\mathbf{x}_i, \mathbf{x}_{\text{new}})$ first.
Now, calculate the weighted sum: substitute the values of $\alpha_i$, $y_i$, and $K(\mathbf{x}_i, \mathbf{x}_{\text{new}})$, and calculate each term.
Adding them up with the bias $b$ gives the score inside the sign function.
Since the sign of this score determines the class, the new point is assigned to class $+1$ if the score is positive and to class $-1$ otherwise.
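Putting the pieces together, a sketch of the kernel decision function; the multipliers, bias, points, and kernel are hypothetical stand-ins for the assignment's values:

```python
import numpy as np

# Hypothetical model: 1-D points, labels, multipliers, bias; kernel (x z + 1)^2.
X = np.array([1.0, 2.0, -1.0])
y = np.array([1.0, 1.0, -1.0])
alpha = np.array([0.5, 0.0, 0.5])
b = -1.0

def K(a, c):
    return (a * c + 1.0) ** 2

def classify(x_new):
    # f(x) = sign(sum_i alpha_i y_i K(x_i, x) + b)
    score = sum(alpha[i] * y[i] * K(X[i], x_new) for i in range(3)) + b
    return int(np.sign(score))

print(classify(3.0))   # -> 1
print(classify(-3.0))  # -> -1
```

Only the points with $\alpha_i > 0$ (the support vectors) contribute to the score, so in practice the sum runs over support vectors alone.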